An Improved Bound on the VC-Dimension of Neural Networks with Polynomial Activation Functions

Authors

  • J. Maurice Rojas
  • M. Vidyasagar
Abstract

We derive an improved upper bound for the VC-dimension of neural networks with polynomial activation functions. This improved bound is based on a result of Rojas [Roj00] on the number of connected components of a semi-algebraic set.


Similar References

Tight Bounds for the VC-Dimension of Piecewise Polynomial Networks

O(ws(s log d + log(dqh/s))) and O(ws((h/s) log q) + log(dqh/s)) are upper bounds for the VC-dimension of a set of neural networks of units with piecewise polynomial activation functions, where s is the depth of the network, h is the number of hidden units, w is the number of adjustable parameters, q is the maximum number of polynomial segments of the activation function, and d is the max...
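The two bound expressions can be compared numerically. The sketch below evaluates their asymptotic forms only — the constants hidden by the O-notation are dropped, and the parameter values are purely illustrative, not taken from the paper:

```python
import math

# Asymptotic forms of the two upper bounds, with the O(.) constants omitted.
# w: adjustable parameters, s: depth, d: max degree, q: max number of
# polynomial segments of the activation, h: hidden units.
def bound_a(w, s, d, q, h):
    return w * s * (s * math.log(d) + math.log(d * q * h / s))

def bound_b(w, s, d, q, h):
    return w * s * ((h / s) * math.log(q)) + math.log(d * q * h / s)

# Illustrative network: 1000 parameters, depth 3, degree-2 activations
# with 2 polynomial segments, 100 hidden units.
print(bound_a(1000, 3, 2, 2, 100))
print(bound_b(1000, 3, 2, 2, 100))
```

Which expression is smaller depends on the regime (e.g. deep-and-narrow versus shallow-and-wide networks), which is why the paper states both.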


Bounding VC-dimension of neural networks: Progress and prospects

Techniques from differential topology are used to give polynomial bounds for the VC-dimension of sigmoidal neural networks. The bounds are quadratic in w, the dimension of the space of weights. Similar results are obtained for a wide class of Pfaffian activation functions. The obstruction (in differential topology) to improving the bound to an optimal bound O(w log w) is discussed, and attention i...


Polynomial Bounds for the VC-Dimension of Sigmoidal, Radial Basis Function, and Sigma-pi Networks

W²h² is an asymptotic upper bound for the VC-dimension of a large class of neural networks, including sigmoidal, radial basis function, and sigma-pi networks, where h is the number of hidden units and W is the number of adjustable parameters; this extends Karpinski and Macintyre's recent results. The class is characterized by polynomial input functions and activation functions that are sol...
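As a quick numerical sketch of the W²h² asymptotic form (again with the implied constant omitted and illustrative values only):

```python
# Asymptotic W^2 * h^2 upper bound on the VC-dimension, constant omitted.
# W = number of adjustable parameters, h = number of hidden units.
def vc_bound(W, h):
    return W ** 2 * h ** 2

# Illustrative: 50 parameters, 10 hidden units.
print(vc_bound(50, 10))  # 50^2 * 10^2 = 250000
```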


Polynomial Bounds for VC Dimension of Sigmoidal and General Pfaffian Neural Networks

We introduce a new method for proving explicit upper bounds on the VC dimension of general functional basis networks, and prove as an application, for the first time, that the VC dimension of analog neural networks with the sigmoidal activation function σ(y) = 1/(1 + e^{-y}) is bounded by a quadratic polynomial O((lm)²) in both the number l of programmable parameters and the number m of nodes. The pro...
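The activation referenced in this abstract is the standard logistic sigmoid; a minimal implementation for reference:

```python
import math

# Standard logistic sigmoid, sigma(y) = 1 / (1 + e^(-y)),
# the activation function named in the abstract above.
def sigmoid(y):
    return 1.0 / (1.0 + math.exp(-y))

# sigma(0) = 0.5; large |y| saturates toward 0 or 1.
print(sigmoid(0.0))  # 0.5
```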


Vapnik-Chervonenkis Dimension of Recurrent Neural Networks

Most of the work on the Vapnik-Chervonenkis dimension of neural networks has been focused on feedforward networks. However, recurrent networks are also widely used in learning applications, in particular when time is a relevant parameter. This paper provides lower and upper bounds for the VC dimension of such networks. Several types of activation functions are discussed, including threshold, po...




Publication date: 2002